G-Prop-III: Global Optimization of Multilayer Perceptrons using an Evolutionary

Authors

  • V. Rivas
  • G. Romero
Abstract

This paper proposes a new version of a method (G-Prop-III, genetic backpropagation) that attempts to solve the problem of finding appropriate initial weights and learning parameters for a single-hidden-layer Multilayer Perceptron (MLP) by combining a genetic algorithm (GA) and backpropagation (BP). The GA selects the initial weights and the learning rate of the network, and changes the number of neurons in the hidden layer through the application of specific genetic operators. In addition, this new version of the algorithm includes BP training as a mutation operator. G-Prop-III combines the advantages of the global search performed by the GA over the MLP parameter space with the local search of the BP algorithm. The application of the G-Prop-III algorithm to several real-world and benchmark problems shows that MLPs evolved using G-Prop-III are smaller and achieve a higher level of generalization than those produced by other perceptron training algorithms, such as QuickPropagation or RPROP, and other evolutionary algorithms, such as G-LVQ. It also shows some improvement over previous versions of the algorithm.
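The abstract's hybrid scheme (a GA over initial weights and learning rate, with a short BP run acting as the mutation operator) can be illustrated with a minimal, hypothetical sketch. This is not the authors' implementation: it fixes the hidden-layer size at 4 (the paper's size-changing genetic operators and crossover are omitted), uses XOR as a stand-in problem, and keeps the trained weights in the offspring (Lamarckian evolution).

```python
# Hypothetical toy sketch of a GA/BP hybrid in the spirit of G-Prop-III.
# An individual = (learning rate, initial weights); BP training is the
# mutation operator. Simplified: fixed hidden size, no crossover.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)  # XOR inputs
y = np.array([[0], [1], [1], [0]], float)              # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def new_individual(hidden=4):
    return {"lr": 10 ** rng.uniform(-1, 0.5),          # learning-rate gene
            "W1": rng.normal(0, 1, (2, hidden)),       # input->hidden weights
            "b1": rng.normal(0, 1, hidden),
            "W2": rng.normal(0, 1, (hidden, 1)),       # hidden->output weights
            "b2": rng.normal(0, 1, 1)}

def forward(ind):
    h = sigmoid(X @ ind["W1"] + ind["b1"])
    return h, sigmoid(h @ ind["W2"] + ind["b2"])

def fitness(ind):
    _, out = forward(ind)
    return -float(np.mean((out - y) ** 2))             # higher is better

def bp_mutate(ind, epochs=200):
    """Backpropagation used as a mutation operator (weights written back)."""
    for _ in range(epochs):
        h, out = forward(ind)
        d_out = (out - y) * out * (1 - out)            # output-layer delta
        d_h = (d_out @ ind["W2"].T) * h * (1 - h)      # hidden-layer delta
        ind["W2"] -= ind["lr"] * (h.T @ d_out)
        ind["b2"] -= ind["lr"] * d_out.sum(axis=0)
        ind["W1"] -= ind["lr"] * (X.T @ d_h)
        ind["b1"] -= ind["lr"] * d_h.sum(axis=0)
    return ind

pop = [new_individual() for _ in range(8)]
initial_best = max(fitness(p) for p in pop)
for gen in range(10):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:4]                                  # truncation selection
    children = [bp_mutate({k: (v.copy() if hasattr(v, "copy") else v)
                           for k, v in p.items()}) for p in parents]
    pop = parents + children                           # elitist replacement

best = max(pop, key=fitness)
```

Because parents survive unchanged and mutation only runs on copies, the best fitness in the population never decreases across generations; the GA's global search over initial conditions and BP's local descent divide the work, as the abstract describes.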


Similar articles

G-Prop: Global optimization of multilayer perceptrons using GAs

A general problem in model selection is to obtain the right parameters that make a model fit observed data. For a multilayer perceptron (MLP) trained with backpropagation (BP), this means finding an appropriate layer size and initial weights. This paper proposes a method (G-Prop, genetic backpropagation) that attempts to solve that problem by combining a genetic algorithm (GA) and BP to train MLPs w...

Full text

G-Prop-II: Global Optimization of Multilayer Perceptrons using GAs

A general problem in model selection is to obtain the right parameters that make a model fit observed data. For a Multilayer Perceptron (MLP) trained with Backpropagation (BP), this means finding the right hidden layer size, appropriate initial weights and learning parameters. This paper proposes a method (G-Prop-II) that attempts to solve that problem by combining a genetic algorithm (GA) and ...

Full text

Comparing Hybrid Systems to Design and Optimize Artificial Neural Networks

In this paper we conduct a comparative study between hybrid methods to optimize multilayer perceptrons: a model that optimizes the architecture and initial weights of multilayer perceptrons; a parallel approach to optimize the architecture and initial weights of multilayer perceptrons; a method that searches for the parameters of the training algorithm, and an approach for cooperative co-evolut...

Full text

SA-Prop: Optimization of Multilayer Perceptron Parameters using Simulated

A general problem in model selection is to obtain the right parameters that make a model fit observed data. If the model selected is a Multilayer Perceptron (MLP) trained with Backpropagation (BP), it is necessary to find appropriate initial weights and learning parameters. This paper proposes a method that combines Simulated Annealing (SimAnn) and BP to train MLPs with a single hidden layer, terme...

Full text
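The SA-Prop blurb above swaps the GA for simulated annealing over the same kind of parameters (initial weights, learning rate). A minimal, hypothetical sketch of that search loop, with a made-up smooth surrogate standing in for "train an MLP with BP and return its validation error" (only the annealing mechanics are the point here):

```python
# Hypothetical toy: simulated annealing over (learning rate, weight scale).
# surrogate_error is a fabricated smooth stand-in for a real BP training run.
import math
import random

random.seed(1)

def surrogate_error(lr, scale):
    # Stand-in objective: minimized near lr = 0.1, scale = 0.5.
    return (math.log10(lr) + 1.0) ** 2 + (scale - 0.5) ** 2

state = (1.0, 2.0)                    # initial (learning rate, weight scale)
energy = surrogate_error(*state)
best_energy = energy
T = 1.0                               # initial temperature
for step in range(500):
    lr, scale = state
    cand = (max(1e-4, lr * 10 ** random.uniform(-0.2, 0.2)),  # log-scale step
            max(1e-3, scale + random.uniform(-0.1, 0.1)))
    e = surrogate_error(*cand)
    # Metropolis rule: always accept improvements, sometimes accept worse
    if e < energy or random.random() < math.exp((energy - e) / T):
        state, energy = cand, e
    best_energy = min(best_energy, energy)
    T *= 0.99                         # geometric cooling schedule
```

Early on, the high temperature lets the search accept worse parameter settings and escape local minima; as T decays, the loop behaves more like greedy descent, mirroring the SimAnn-plus-BP division of labor the blurb describes.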



Journal title:

Volume   Issue 

Pages  -

Publication date: 2007